SBERT-WK: A Sentence Embedding Method by Dissecting BERT-Based Word Models

Authors

Abstract


Similar References

Negative-Sampling Word-Embedding Method

The word2vec software of Tomas Mikolov and colleagues has gained a lot of traction lately, and provides state-of-the-art word embeddings. The learning models behind the software are described in two research papers [1, 2]. We found the description of the models in these papers to be somewhat cryptic and hard to follow. While the motivations and presentation may be obvious to the neural-networks...
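
For readers who want to experiment with the skip-gram-with-negative-sampling model those papers describe, the snippet below is a minimal sketch using gensim's Word2Vec (an independent re-implementation of word2vec). The toy corpus and all parameter values are placeholders, not taken from the papers above.

```python
# Minimal sketch: skip-gram word embeddings with negative sampling via gensim.
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens (placeholder data).
sentences = [
    ["sentence", "embeddings", "from", "word", "vectors"],
    ["negative", "sampling", "approximates", "the", "softmax"],
    ["word2vec", "learns", "word", "vectors", "from", "context"],
]

model = Word2Vec(
    sentences,
    vector_size=100,   # embedding dimensionality
    window=5,          # context window size
    sg=1,              # 1 = skip-gram (0 would be CBOW)
    negative=5,        # number of negative samples per positive pair
    min_count=1,       # keep all tokens in this toy corpus
    epochs=50,
)

vec = model.wv["word"]                 # 100-d embedding for "word"
print(model.wv.most_similar("word"))   # nearest neighbours by cosine similarity
```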


Dissecting psychiatric spectrum disorders by generative embedding

This proof-of-concept study examines the feasibility of defining subgroups in psychiatric spectrum disorders by generative embedding, using dynamical system models which infer neuronal circuit mechanisms from neuroimaging data. To this end, we re-analysed an fMRI dataset of 41 patients diagnosed with schizophrenia and 42 healthy controls performing a numerical n-back working-memory task. In our...
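
As an illustration of the generative-embedding idea (fit a generative model per subject, then look for subgroups by clustering in the space of inferred parameters), here is a minimal sketch. The dynamical-system fitting step is replaced by random placeholder parameters, and the use of scikit-learn's Gaussian mixture model with BIC-based model selection is an assumption for demonstration, not the study's actual procedure.

```python
# Minimal sketch of generative embedding: subjects are represented by fitted
# model parameters, and subgroups are sought by clustering in that space.
# The fMRI/dynamical-system fitting is replaced by random placeholders here.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
n_subjects, n_params = 83, 6                      # 41 patients + 42 controls
theta = rng.normal(size=(n_subjects, n_params))   # stand-in for fitted parameters

X = StandardScaler().fit_transform(theta)

# Pick the number of subgroups by the Bayesian information criterion.
best = min(
    (GaussianMixture(n_components=k, random_state=0).fit(X) for k in range(1, 6)),
    key=lambda gm: gm.bic(X),
)
labels = best.predict(X)              # putative subgroup label per subject
print("subgroups found:", best.n_components)
```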


Spectral Graph-Based Method of Multimodal Word Embedding

In this paper, we propose a novel method for multimodal word embedding, which exploits a generalized framework of multiview spectral graph embedding to take into account visual appearances or scenes denoted by words in a corpus. We evaluated our method through word similarity tasks and a concept-to-image search task, having found that it provides word representations that reflect visual informat...
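
The sketch below illustrates the underlying spectral graph embedding step on two hypothetical views (a textual and a visual affinity matrix): averaging the two normalized Laplacians and taking the low eigenvectors is a simplification of the paper's generalized multiview framework, and all data here are random placeholders.

```python
# Minimal sketch: spectral embedding over two views of a vocabulary by
# averaging their normalized graph Laplacians and taking low eigenvectors.
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
n_words, dim = 50, 5

def laplacian(W):
    """Symmetric normalized Laplacian L = I - D^{-1/2} W D^{-1/2}."""
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return np.eye(len(W)) - D_inv_sqrt @ W @ D_inv_sqrt

def random_affinity(n):
    A = rng.random((n, n))
    W = (A + A.T) / 2            # symmetric nonnegative weights
    np.fill_diagonal(W, 0.0)
    return W

W_text, W_visual = random_affinity(n_words), random_affinity(n_words)
L = 0.5 * laplacian(W_text) + 0.5 * laplacian(W_visual)

# Eigenvectors of the smallest nonzero eigenvalues serve as word coordinates.
vals, vecs = eigh(L)
embeddings = vecs[:, 1:dim + 1]   # skip the trivial constant eigenvector
print(embeddings.shape)           # (50, 5)
```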


A Feature Extraction Method Based on Word Embedding for Word Similarity Computing

In this paper, we introduce a new NLP task, similar to the word expansion or word similarity tasks, which discovers words sharing the same semantic components (feature sub-space) with seed words. We also propose a feature extraction method based on word embeddings for this problem. We train word embeddings using state-of-the-art methods like word2vec and models supplied by the Stanford NLP Group...
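
A minimal sketch of the general seed-word expansion setting is given below, using gensim's KeyedVectors and a toy vocabulary. Ranking candidates by cosine similarity to the mean seed vector is a simplified stand-in, not the feature sub-space extraction the paper proposes; the vocabulary and 4-d vectors are placeholders rather than real word2vec or GloVe vectors.

```python
# Minimal sketch: expand a set of seed words by ranking the vocabulary
# against the mean seed vector (cosine similarity), using gensim.
import numpy as np
from gensim.models import KeyedVectors

kv = KeyedVectors(vector_size=4)
kv.add_vectors(
    ["red", "green", "blue", "car", "bike"],
    np.array([[1.0, 0.0, 0.0, 0.0],
              [0.9, 0.1, 0.0, 0.0],
              [0.8, 0.2, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0],
              [0.0, 0.0, 0.9, 0.1]], dtype=np.float32),
)

seeds = ["red", "green"]
# most_similar averages the seed vectors and ranks all other words by cosine.
expansion = kv.most_similar(positive=seeds, topn=3)
print(expansion)   # "blue" should rank first (shares the colour component)
```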


Evaluation method of word embedding by roots and affixes

Word embedding has been shown to be remarkably effective in many Natural Language Processing tasks. However, existing models still have limitations in interpreting the dimensions of word vectors. In this paper, we provide a new approach, the roots and affixes model (RAAM), to interpret them from the intrinsic structures of natural language. It can also be used as an evaluation measure of ...
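
The sketch below shows one way a morphology-based check in the spirit of RAAM could look: word pairs sharing a root should score higher cosine similarity than random pairs. The word list and vectors are placeholders, and the measure is only an illustrative stand-in, not the paper's actual evaluation.

```python
# Minimal sketch: compare cosine similarity of root-sharing word pairs
# against random pairs, using placeholder embeddings.
import numpy as np

rng = np.random.default_rng(0)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

vocab = ["teach", "teacher", "teaching", "banana", "quartz", "river"]
emb = {w: rng.normal(size=50) for w in vocab}
# Make morphologically related words artificially similar for the demo.
emb["teacher"] = emb["teach"] + 0.1 * rng.normal(size=50)
emb["teaching"] = emb["teach"] + 0.1 * rng.normal(size=50)

related_pairs = [("teach", "teacher"), ("teach", "teaching")]
random_pairs = [("teach", "banana"), ("quartz", "river")]

rel = np.mean([cosine(emb[a], emb[b]) for a, b in related_pairs])
rnd = np.mean([cosine(emb[a], emb[b]) for a, b in random_pairs])
print(f"related: {rel:.3f}  random: {rnd:.3f}")  # should be higher for related pairs
```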



Journal

Journal title: IEEE/ACM Transactions on Audio, Speech, and Language Processing

Year: 2020

ISSN: 2329-9290, 2329-9304

DOI: 10.1109/taslp.2020.3008390